TRECVID 2007 Search Tasks by NUS-ICT

Authors

  • Tat-Seng Chua
  • Shi-Yong Neo
  • Yantao Zheng
  • Hai-Kiat Goh
  • Xiaoming Zhang
  • Sheng Tang
  • Yongdong Zhang
  • Jintao Li
  • Juan Cao
  • Huan-Bo Luan
  • Qiao-Yan He
  • Xu Zhang
Abstract

This paper describes our systems for the automated and interactive search tasks in TRECVID 2007. The shift from news video to documentary video this year has prompted a series of changes to the processing techniques we developed over the past few years. For the automated search task, we employ our previous query-dependent retrieval framework, which automatically discovers the query class and query-high-level-features (query-HLF) to fuse the available multimodal features. Different from previous work, our system this year gives more emphasis to visual features such as color, texture and motion in the video source. The reasons are: (a) given the low quality of ASR text and the more visual and motion-oriented queries, we expect the visual features to be as discriminating as the text feature; and (b) the appropriate use of motion features is highly effective for such queries, as motion features are able to model intra-frame changes. For the interactive task, we first use the automated search results as the starting point for user feedback. The user can then employ our intuitive retrieval interface, with a variety of relevance feedback techniques, to refine the search results. In addition, we introduce motion-icons, which allow users to see a dynamic series of keyframes instead of a single keyframe during assessment. Results show that the approach can help provide better discrimination.
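The query-class-dependent fusion described above can be pictured as a weighted linear combination of per-modality retrieval scores, with the weights selected according to the detected query class. The sketch below is only an illustrative reading of that idea under assumed names: the query classes, modality names and weight table are hypothetical placeholders, not values taken from the paper.

```python
from typing import Dict, List

# Hypothetical query classes and per-class fusion weights; the actual
# classes and weights used by NUS-ICT are not stated in the abstract.
FUSION_WEIGHTS: Dict[str, Dict[str, float]] = {
    "person":  {"text": 0.5, "color": 0.2, "texture": 0.1, "motion": 0.2},
    "scene":   {"text": 0.2, "color": 0.4, "texture": 0.3, "motion": 0.1},
    "event":   {"text": 0.2, "color": 0.2, "texture": 0.1, "motion": 0.5},
    "general": {"text": 0.4, "color": 0.2, "texture": 0.2, "motion": 0.2},
}

def fuse_scores(query_class: str,
                modality_scores: Dict[str, Dict[str, float]]) -> List[str]:
    """Rank shots by a query-class-dependent weighted sum of modality scores.

    modality_scores maps a modality name (e.g. "text", "motion") to a
    dictionary of {shot_id: retrieval score}.
    """
    weights = FUSION_WEIGHTS.get(query_class, FUSION_WEIGHTS["general"])
    fused: Dict[str, float] = {}
    for modality, shot_scores in modality_scores.items():
        w = weights.get(modality, 0.0)
        for shot_id, score in shot_scores.items():
            fused[shot_id] = fused.get(shot_id, 0.0) + w * score
    # Higher fused score means a more relevant shot.
    return sorted(fused, key=fused.get, reverse=True)

# Usage: a motion-oriented ("event") query leans more heavily on the
# motion modality, as point (b) in the abstract suggests.
ranking = fuse_scores("event", {
    "text":   {"shot_01": 0.1, "shot_02": 0.6},
    "motion": {"shot_01": 0.9, "shot_02": 0.2},
})
print(ranking)  # ['shot_01', 'shot_02']
```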

Similar resources

TRECVID 2010 Known-item Search by NUS

This paper describes our system for automatic search and interactive search in the known-item search (KIS) task in TRECVID 2010. The KIS task aims to find a unique video answer for each text query. The shift from traditional video search has prompted a series of challenges for the processing and searching techniques developed over the past few years. For the automatic search task, our VisionGo system p...

News Video Retrieval by Learning Multimodal Semantic Information

With the explosion of multimedia data, especially video data, efficient video retrieval has become more and more important. Years of TREC Video Retrieval Evaluation (TRECVID) research have provided a benchmark for the video search task. The video data in TRECVID are mainly news videos. In this paper a compound model consisting of several atom search modules, i.e., textual and visual, f...

MSRA-USTC-SJTU at TRECVID 2007: High-Level Feature Extraction and Search

This paper describes the MSRA-USTC-SJTU experiments for TRECVID 2007. We performed the experiments in high-level feature extraction and automatic search tasks. For high-level feature extraction, we investigated the benefit of unlabeled data by semi-supervised learning, and the multi-layer (ML) multi-instance (MI) relation embedded in video by MLMI kernel, as well as the correlations between con...

AT&T Research at TRECVID 2006

TRECVID (TREC Video Retrieval Evaluation) is sponsored by NIST to encourage research in digital video indexing and retrieval. It was initiated in 2001 as a “video track” of TREC and became an independent evaluation in 2003. AT&T participated in three tasks in TRECVID 2006: shot boundary determination (SBD), search, and rushes exploitation. The proposed SBD algorithm contains a set of finite sta...

TRECVID 2007--Overview

The TREC Video Retrieval Evaluation (TRECVID) 2007 represents the seventh running of a TREC-style (trec.nist.gov) video retrieval evaluation, the goal of which remains to promote progress in content-based retrieval from digital video via open, metrics-based evaluation. Over time this effort should yield a better understanding of how systems can effectively accomplish such retrieval and how one ...

Journal title:

Volume   Issue

Pages  -

Publication year: 2007